

A Literature Review On Stewart-Gough Platform Calibrations

Karmakar, Sourabh, Turner, Cameron J.

arXiv.org Artificial Intelligence

Researchers have extensively studied Stewart-Gough platforms, also known as Gough-Stewart or hexapod platforms, for their inherently fine control characteristics. These studies have led to potential deployment opportunities in many critical applications such as medicine, engineering machinery, space research, electronic chip manufacturing, and automobile manufacturing. Some of these applications require micro- and nano-level movement control in 3D space, with motions that are precise, complicated, and repeatable; a Stewart-Gough platform meets these challenges well. To do so, the platform must be more accurate than the specified application accuracy level, so proper calibration of the parallel robot is crucial. Forward kinematics-based calibration of these hexapod machines becomes unnecessarily complex, whereas inverse kinematics completes the task with much greater ease. To experiment with different calibration techniques, various approaches have been implemented: using external instruments, constraining one or more motions of the system, and using extra sensors for auto- or self-calibration. This survey examines those key methodologies, their outcomes, and important details related to inverse kinematics-based parallel robot calibration. It was observed during this study that researchers focused on improving the accuracy of the platform position and orientation, considering errors contributed by one source or multiple sources. The error sources considered are mainly kinematic and structural; in some cases environmental factors are also reviewed, although those calibrations are performed under no-load conditions. This study aims to review the present state of the art in this field and highlight the processes and errors considered in the calibration of Stewart-Gough platforms.
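The inverse kinematics that make this calibration tractable are simple: given the commanded pose of the moving platform, each leg length follows in closed form from the joint coordinates. A minimal NumPy sketch, using a hypothetical symmetric joint layout (real hexapods typically pair base and platform joints in a crossed pattern):

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, p, R):
    """Inverse kinematics: actuator length for each (base, platform) joint pair.

    base_pts, plat_pts: (6, 3) arrays of joint coordinates in their own frames.
    p: (3,) platform position; R: (3, 3) platform orientation matrix.
    """
    world_plat = (R @ plat_pts.T).T + p      # platform joints in the base frame
    return np.linalg.norm(world_plat - base_pts, axis=1)

# Hypothetical geometry: joints on circles of radius 1.0 (base) and 0.5
# (platform), platform hovering 1.0 above the base with no rotation.
ang = np.deg2rad(np.arange(0, 360, 60))
base = np.stack([np.cos(ang), np.sin(ang), np.zeros(6)], axis=1)
plat = 0.5 * np.stack([np.cos(ang), np.sin(ang), np.zeros(6)], axis=1)
L = leg_lengths(base, plat, p=np.array([0.0, 0.0, 1.0]), R=np.eye(3))
```

Because this toy pose is a pure vertical translation of a symmetric layout, all six legs come out equal; calibration methods exploit exactly this kind of closed-form length computation in their error models.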


Calibration of Parallel Kinematic Machine Based on Stewart Platform-A Literature Review

Karmakar, Sourabh, Patel, Apurva, Turner, Cameron J.

arXiv.org Artificial Intelligence

Stewart platform-based Parallel Kinematic Machines (PKMs) have been extensively studied by researchers for their inherently finer control characteristics. This has opened potential deployment opportunities in versatile critical applications such as medicine, engineering machinery, space research, electronic chip manufacturing, and automobile manufacturing. All of these precise, complicated, and repeatable motion applications require micro- and nano-scale movement control in 3D space, a challenge a 6-DOF PKM can meet well. For this, the PKM must be more accurate than the desired application accuracy level, so proper calibration of a PKM robot is essential. Forward kinematics-based calibration of such hexapod machines becomes unnecessarily complex, whereas inverse kinematics completes the task with much greater ease. To analyze different techniques, external instrument-based, constraint-based, and auto- or self-calibration-based approaches have been used. This survey reviews these key methodologies, their outcomes, and important points related to inverse kinematics-based PKM calibration in general. It is observed in this study that researchers focused on improving the accuracy of platform position and orientation, considering errors contributed by a single source or multiple sources. The error sources considered are mainly structural; in some cases environmental factors are also considered, although these calibrations are done under no-load conditions. This study aims to capture the current state of the art in this field and to expand the scope for further exploration by other researchers in specific areas.


Data-Driven Temperature Modelling of Machine Tools by Neural Networks: A Benchmark

Coelho, C., Hohmann, M., Fernández, D., Penter, L., Ihlenfeldt, S., Niggemann, O.

arXiv.org Artificial Intelligence

Traditional thermal error correction/compensation methods rely on measured temperature-deformation fields or on transfer functions. Most existing data-driven compensation strategies employ neural networks (NNs) to directly predict thermal errors or specific compensation values. While effective, these approaches are tightly bound to particular error types, spatial locations, or machine configurations, limiting their generality and adaptability. In this work, we introduce a novel paradigm in which NNs are trained to predict high-fidelity temperature and heat flux fields within the machine tool. The proposed framework enables subsequent computation and correction of a wide range of error types using modular, swappable downstream components. The NN is trained using data obtained with the finite element method under varying initial conditions and incorporates a correlation-based selection strategy that identifies the most informative measurement points, minimising hardware requirements during inference. We further benchmark state-of-the-art time-series NN architectures, namely Recurrent NN, Gated Recurrent Unit, Long Short-Term Memory (LSTM), Bidirectional LSTM, Transformer, and Temporal Convolutional Network, by training both specialised models, tailored for specific initial conditions, and general models, capable of extrapolating to unseen scenarios. The results show accurate and low-cost prediction of temperature and heat flux fields, laying the basis for enabling flexible and generalisable thermal error correction in machine tool environments.
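The correlation-based selection of informative measurement points can be sketched as a greedy ranking by correlation with the target quantity, skipping candidates that are redundant with points already chosen. This is a hypothetical illustration of the idea, not the paper's exact strategy:

```python
import numpy as np

def select_points(fields, target, k=3, redundancy=0.95):
    """Greedy correlation-based selection of measurement nodes.

    fields: (T, N) simulated temperatures at N candidate nodes over T steps.
    target: (T,) quantity to be predicted (e.g. a key deformation).
    Picks up to k nodes most correlated with the target, skipping any node
    whose correlation with an already-chosen node exceeds `redundancy`.
    """
    T, N = fields.shape
    rel = np.abs([np.corrcoef(fields[:, j], target)[0, 1] for j in range(N)])
    chosen = []
    for j in np.argsort(rel)[::-1]:          # best candidates first
        if all(abs(np.corrcoef(fields[:, j], fields[:, c])[0, 1]) < redundancy
               for c in chosen):
            chosen.append(j)
        if len(chosen) == k:
            break
    return chosen

# Toy data: node 1 duplicates node 0 (both track the target), node 2 is noise;
# the duplicate should be skipped as redundant.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
target = np.sin(2 * np.pi * t)
fields = np.stack([target + 0.01 * rng.standard_normal(200),
                   target + 0.01 * rng.standard_normal(200),
                   rng.standard_normal(200)], axis=1)
picks = select_points(fields, target, k=2)
```

The redundancy check is what keeps the sensor count (and hence inference-time hardware) low: a second thermocouple that merely mirrors an existing one adds no information.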


TinyML Towards Industry 4.0: Resource-Efficient Process Monitoring of a Milling Machine

Langer, Tim, Widra, Matthias, Beyer, Volkhard

arXiv.org Artificial Intelligence

In the context of Industry 4.0, long-serving industrial machines can be retrofitted with process monitoring capabilities for future use in a smart factory. One possible approach is the deployment of wireless monitoring systems, which can benefit substantially from the TinyML paradigm. This work presents a complete TinyML flow, from dataset generation, through machine learning model development, up to the implementation and evaluation of a full preprocessing and classification pipeline on a microcontroller. After a short review of TinyML in industrial process monitoring, the creation of the novel MillingVibes dataset is described. The feasibility of a TinyML system for structure-integrated process quality monitoring is demonstrated through the development of an 8-bit-quantized convolutional neural network (CNN) model with 12.59 kiB of parameter storage. The model reaches a test accuracy of 100.0% at 15.4 ms inference time and 1.462 mJ per quantized CNN inference on an ARM Cortex-M4F microcontroller, serving as a reference for future TinyML process monitoring solutions.
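The 8-bit quantization at the heart of such a deployment can be sketched as a symmetric mapping of float weights to int8 with a per-tensor scale. This is a generic illustration of the technique, not the paper's specific quantization scheme:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization of a weight tensor.

    Returns the int8 weights and the scale needed to dequantize them.
    """
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Round-trip a random weight tensor; the per-element error is bounded by
# half the quantization step (scale / 2).
rng = np.random.default_rng(1)
w = rng.standard_normal((8, 8)).astype(np.float32)
q, s = quantize_int8(w)
err = np.max(np.abs(w - dequantize(q, s)))
```

Storing int8 weights instead of float32 cuts parameter storage by 4x, which is what makes a 12.59 kiB model plausible on a Cortex-M-class part; the accuracy cost is bounded by the rounding error shown above.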


A Cutting Mechanics-based Machine Learning Modeling Method to Discover Governing Equations of Machining Dynamics

Ren, Alisa, Ma, Mason, Wu, Jiajie, Karandikar, Jaydeep, Tyler, Chris, Shi, Tony, Schmitz, Tony

arXiv.org Artificial Intelligence

This paper proposes a cutting mechanics-based machine learning (CMML) modeling method to discover the governing equations of machining dynamics. The main idea of CMML is to integrate existing physics from cutting mechanics with unknown physics in data to achieve automated model discovery, with the potential to advance machining modeling. Based on existing cutting mechanics, CMML first establishes a general modeling structure governing machining dynamics, represented by a set of unknown differential algebraic equations. CMML then achieves data-driven discovery of these unknown equations through an effective cutting mechanics-based nonlinear learning function space design and a discrete optimization-based learning algorithm. An experimentally verified time-domain simulation of milling is used to validate the proposed modeling method. Numerical results show that CMML can discover the exact milling dynamics models, including process damping and edge force, from noisy data. This indicates that CMML has the potential to advance machining modeling in practice alongside the development of effective metrology systems.
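Equation discovery of this kind can be illustrated with sparse regression over a candidate function library, solved by sequentially thresholded least squares (a SINDy-style sketch on a made-up one-dimensional system, not the paper's milling model):

```python
import numpy as np

def stlsq(Theta, dxdt, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: sparse fit of dxdt = Theta @ xi."""
    xi = np.linalg.lstsq(Theta, dxdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                      # prune negligible terms
        big = ~small
        if big.any():                        # refit the surviving terms
            xi[big] = np.linalg.lstsq(Theta[:, big], dxdt, rcond=None)[0]
    return xi

# Toy dynamics: x' = -2 x + 0.5 x^3, sampled exactly.  The candidate library
# also contains a constant and x^2, which the fit should discard.
x = np.linspace(-1.5, 1.5, 400)
dxdt = -2.0 * x + 0.5 * x**3
Theta = np.stack([np.ones_like(x), x, x**2, x**3], axis=1)
xi = stlsq(Theta, dxdt)
```

CMML's contribution, per the abstract, is designing the function library from cutting mechanics rather than generic polynomials, and using discrete optimization for the sparse selection; the thresholded refit above is only the simplest stand-in for that step.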


Optimized Task Assignment and Predictive Maintenance for Industrial Machines using Markov Decision Process

Nasir, Ali, Mekid, Samir, Sawlan, Zaid, Alsawafy, Omar

arXiv.org Artificial Intelligence

The importance of predictive maintenance is well recognized in the industrial sector for several reasons: it reduces machine downtime, helps lower production costs, and extends the life of machines. Consequently, predictive maintenance is one of the key areas of research in the scientific community. Initially, predictive maintenance was time-based, but later on, with advances in sensing technology, condition-based maintenance (CBM) gained popularity. Maintenance of machine tools involves two key stages: diagnosis and prognosis. Prognosis deals with the prediction of the remaining useful life (RUL) of the machine, whereas diagnosis is concerned with the detection and identification of faults in the machine. Major approaches to prognosis include data-based, knowledge-based, and physics (model)-based approaches. Diagnosis, on the other hand, is based on centralized or distributed approaches [1]. Key challenges in predictive maintenance include 1) dealing with noisy sensor data, 2) uncertainty in the operating conditions, and 3) the diversity of tasks assigned to the machine. A comparison between time-based and condition-based maintenance strategies is presented in [2].
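A toy version of such an MDP, with hypothetical machine-condition states and produce/maintain actions, can be solved by standard value iteration. All transition probabilities and rewards below are invented for illustration:

```python
import numpy as np

# States: 0 = good, 1 = worn, 2 = failed.
# Actions: 0 = produce (earns revenue, degrades), 1 = maintain (costs, repairs).
P = np.array([
    [[0.8, 0.2, 0.0],    # produce from good: may wear
     [0.1, 0.8, 0.1],    # produce from worn: may fail
     [0.0, 0.0, 1.0]],   # producing while failed changes nothing
    [[1.0, 0.0, 0.0],    # maintain from good
     [0.9, 0.1, 0.0],    # maintain from worn: usually restored
     [0.7, 0.3, 0.0]],   # repair from failed
]).transpose(1, 0, 2)    # -> P[state, action, next_state]
R = np.array([[10.0, -2.0],    # R[state, action]
              [6.0, -4.0],
              [-50.0, -20.0]])

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Standard value iteration; returns optimal values and a greedy policy."""
    V = np.zeros(P.shape[0])
    while True:
        Q = R + gamma * P @ V            # Q[s, a]
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

V, pi = value_iteration(P, R)
```

With these numbers the optimal policy produces while the machine is good and repairs it once failed, which is exactly the condition-based trade-off the MDP formulation is meant to capture; task diversity would enter as extra state dimensions.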


Sharing Information Between Machine Tools to Improve Surface Finish Forecasting

Clarkson, Daniel R., Bull, Lawrence A., Dardeno, Tina A., Wickramarachchi, Chandula T., Cross, Elizabeth J., Rogers, Timothy J., Worden, Keith, Dervilis, Nikolaos, Hughes, Aidan J.

arXiv.org Artificial Intelligence

At present, most surface-quality prediction methods can only perform single-task prediction which results in under-utilised datasets, repetitive work and increased experimental costs. To counter this, the authors propose a Bayesian hierarchical model to predict surface-roughness measurements for a turning machining process. The hierarchical model is compared to multiple independent Bayesian linear regression models to showcase the benefits of partial pooling in a machining setting with respect to prediction accuracy and uncertainty quantification.
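The partial pooling such a hierarchical model provides can be illustrated with the classic normal-normal shrinkage estimator, here with made-up roughness data and known variances rather than the authors' actual model:

```python
import numpy as np

def partial_pool(group_means, group_sizes, sigma2, tau2):
    """Shrink each machine's sample mean toward the shared mean.

    Normal-normal hierarchical posterior mean with known within-group
    variance sigma2 and between-group variance tau2; groups with little
    data are shrunk hardest toward the grand mean.
    """
    grand = np.average(group_means, weights=group_sizes)
    w = tau2 / (tau2 + sigma2 / np.asarray(group_sizes))  # data weight
    return w * group_means + (1 - w) * grand

# Toy surface-roughness means from three machines; the third machine has
# only 3 samples, so its estimate borrows strength from the other two.
means = np.array([1.0, 1.2, 2.0])
sizes = np.array([50, 50, 3])
pooled = partial_pool(means, sizes, sigma2=0.2, tau2=0.05)
```

The data-poor machine's estimate moves noticeably toward the shared mean while the well-sampled machines barely change, which is the pooling benefit the abstract claims for prediction accuracy and uncertainty quantification.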


How ChatGPT Will Destabilize White-Collar Work - The Atlantic

#artificialintelligence

In the next five years, it is likely that AI will begin to reduce employment for college-educated workers. As the technology continues to advance, it will be able to perform tasks that were previously thought to require a high level of education and skill. This could lead to a displacement of workers in certain industries, as companies look to cut costs by automating processes. While it is difficult to predict the exact extent of this trend, it is clear that AI will have a significant impact on the job market for college-educated workers. It will be important for individuals to stay up to date on the latest developments in AI and to consider how their skills and expertise can be leveraged in a world where machines are increasingly able to perform many tasks.


Peggy Smedley Show: America's Cutting Edge: Machine Tools

#artificialintelligence

Peggy and Tony Schmitz, professor at the University of Tennessee, Knoxville, and joint faculty at Oak Ridge National Laboratory, talk about the ACE (America's Cutting Edge) program and what brought him to the University of Tennessee. He explains that East Tennessee is an exploding ecosystem right now. They also discuss: why the ACE program exists and the importance of machine tools for defense and economic security; how machine tool technology has improved and the challenge of a workforce shortage in making the best use of that equipment; and the results of outsourcing manufacturing to other countries and what needs to happen next. (11/8/22 - 796)


ITRI helps machine tool makers hike product value

#artificialintelligence

Government-sponsored Industrial Technology Research Institute (ITRI) has developed a solution based on digital CNC (computer numerical control) to facilitate the system integration and expansion of machine tools and robotic arms into work cells that command higher product prices. Featuring openness and flexibility, the digital CNC allows the introduction of value-added software into manufacturing processes and enables machine tool makers to develop proprietary software through an SDK, which has helped makers develop value-added functions such as CNC-embedded modules for 3D cutting simulation, main axis monitoring, and remote monitoring, ITRI said. ITRI has cooperated with eight machine tool makers to use the solution in developing high value-added machine tools for turning, turning plus milling, and other machining processes. Through convenient integration of such machine tools into work cells, prices for a machine tool have risen from the original NT$1.4 million (US$48,440) to NT$2.0 million.